Search Results: "tino"

20 May 2014

Laura Arjona: Some experiences, and TODO, about fonts

TL;DR: I don't know much about fonts; I just use the stock ones that come with my system. From time to time I have issues with documents that others create using other fonts. This post is about my plans to learn a bit more and at least know how to solve those issues, if possible, while staying on the bright side (the free-software / free-font side).

Long version

The context

I use Debian, LibreOffice, sometimes Inkscape, and LaTeX. One of my favorite hashtags is #iloveplaintext. I don't know much about design in general, or fonts and typography in particular. I didn't change the fonts in my desktops (I only reduced the size on the laptop, to be able to read a bit more on my low-resolution screen), I rarely change the font in Writer (maybe from Liberation Serif to Liberation Sans), I never changed the font in a LaTeX document or LaTeX Beamer presentation (I write boring documents, I know), and when I paste from the web, it's usually because I want to read a long article, so I paste it as plain text in gedit and print it or save it. So I've never felt the need to learn more about fonts: it just works and covers my needs (or not, but I could mostly live with the issues).

Then a friend on the Pump.io network, Adrián Perales, published a blog post about typography (in Spanish) that I liked very much, and I began to think about (and remember) some of the issues that I have from time to time with fonts.

Issue #1: League Gothic, a free font that was not installed in my system

Today, again on the Pump.io network, I discovered that the FSF published a poster, "Privacy is impossible without free software", in SVG format, but it didn't look right when I opened it with GIMP, clearly due to some missing font. Nice that SVG is a plain-text format (XML)! So I opened the file with gedit and searched for the text string whose font was missing. It was the League Gothic font: of course a free software font, but not packaged in Debian, it seems. No problem.
I downloaded the font, copied the files into /usr/share/fonts, and problem solved.

Issue #2: Book Antiqua, a non-free font (must find an equivalent)

A document made with Microsoft Word that a friend sent me so I could review it and resend it (in PDF format) to other people. It's a leaflet, and it has text in Arial, in Tahoma, and in Book Antiqua. When I open it with LibreOffice it looks wrong (the substitutes are not the same size, so there are broken lines and so on). Book Antiqua is not free. I learned that it's an imitation of the Palatino font, and that a similar font on free software systems is URW Palladio. In Writer (LibreOffice), I went to Tools > Options > Fonts and declared the equivalence of the two fonts, so the program would use URW Palladio as a substitute for Book Antiqua. I opened the document again and it was much better, very similar to the original. I didn't bother changing the text in Arial or Tahoma, since the substitutes that LibreOffice used were quite good. But I bookmarked this page, "A Web Designer's Guide to Linux Fonts", to remember the different fonts that I can try in order to emulate the Windows ones. I also know that I can install the Microsoft Core Fonts for the Web, since they are packaged for Debian in the contrib archive. But I'll try to survive without them for now (until now I didn't bother, so why should I now that I have the substitution guide?). In other news, I was impressed that MS Core Fonts is #4 in all-time downloads on SourceForge, with more than 450 million downloads \o/

Issue #3: Installing a new free font in Debian

So I decided to install one of the fonts that Adrián Perales recommended in his blog post, Linux Libertine. Since it is packaged for Debian, it's super easy:
# apt-get install ttf-linux-libertine
(My LibreOffice was open, so I closed it and opened it again, and the font was there, ready for use.)

Issue #4: Use a different font in LaTeX

Well, as always, there is not one but many ways to do that in LaTeX. My intuition tells me that if there is a LaTeX package for the font that I want to use, it's probably a good idea to just use it. So I searched for Linux Libertine in LaTeX, and yes, there is a package (and you can find a very interesting font guide in The LaTeX Font Catalogue). I installed the package texlive-fonts-extra, and then I added two lines to my LaTeX document:
\usepackage{libertine}
\usepackage[T1]{fontenc}
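For context, those two lines go in the document preamble; a minimal complete document (the body text here is just my illustration) might look like:

```latex
\documentclass{article}
\usepackage{libertine}        % use Linux Libertine as the text font
\usepackage[T1]{fontenc}      % 8-bit font encoding, recommended with Libertine
\begin{document}
This paragraph is set in Linux Libertine instead of Computer Modern.
\end{document}
```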
Compiled, and the resulting PDF was using the Libertine font instead of Computer Modern.

TODO (and/or wishlist)

When opening a document that uses a font not present in my system, I might not even notice that a font is missing and that I'm seeing a substitute (and maybe not the best one)! It would be nice if the program told the user: "This file uses the font X, which doesn't seem to be in your system. I'll use font Y as a substitute." I'm not sure if there is a standard way to know which package contains a certain font. I use a web search engine to try to find out, along with the websites that I linked in this article. I have to investigate and learn a bit more about free-font equivalents to the fonts that other people use, and about fonts in general, so that my documents are more beautiful and people get interested in the tools that I use to produce them. Well, I've written a long blog post (thanks if you read this far!), solved some issues, and tried some things, but I didn't even capture a screenshot to show here! It seems that I'm still lazy, forgive me. I hope at least this #plaintext is useful for you :)
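By the way, the Book Antiqua to URW Palladio substitution from Issue #2 can also be declared outside LibreOffice, with a per-user fontconfig rule. A minimal sketch (the file path and the exact family name "URW Palladio L" are assumptions here; check fc-list on your own system):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<!-- saved as ~/.config/fontconfig/fonts.conf (the usual per-user location) -->
<fontconfig>
  <alias>
    <family>Book Antiqua</family>
    <prefer><family>URW Palladio L</family></prefer>
  </alias>
</fontconfig>
```

After saving it, fc-match "Book Antiqua" should report the Palladio substitute, and every application using fontconfig picks it up, not only LibreOffice.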
Filed under: My experiences and opinion, Uncategorized Tagged: Debian, English, Fonts, Free Software, LaTeX, libre software, LibreOffice, Moving into free software, pump.io

8 March 2013

Maximilian Attems: Wrong moves

A bizarre closed user-tracking schism is rebased from shaky grounds onto a nonexistent target. It is sad to see this continuous decline due to unchallenged decisions.

18 December 2012

Petter Reinholdtsen: Ledger - double-entry accounting using text based storage format

A few days ago I came across a blog post from Joey Hess describing ledger and hledger, a text-based system for double-entry accounting. I found it interesting, as I am involved with several organisations where accounting is an issue, and I have not really become too friendly with the different web-based systems we use. I find it hard to find what I am looking for in the menus, and even harder to get sensible data out of the systems. Ledger seems different. The accounting data is kept in text files that can be stored in a version control system, and there are at least five different implementations able to read the format. An example entry looks like this, and is simple enough that it will be trivial to generate entries based on CSV files fetched from the bank:
2004-05-27 Book Store
      Expenses:Books                 $20.00
      Liabilities:Visa
The concept seemed interesting enough for me to check it out and look for others using it. I found blog posts from Christine Spang, Pete Keen, Andrew Cantino and Ronald Ip describing how they use it, as well as a post from Bradley M. Kuhn at the Software Freedom Conservancy. All seemed like good recommendations fitting my needs. The ledger package is available in Debian Squeeze, while the hledger package is only available in Debian Sid. As I use Squeeze, ledger seemed the best choice to get started. To get some real data to test on, I wrote a web scraper for LODO, the accounting system used by the NUUG association, and started to play with the data set. I'm not really deeply into accounting, but I am able to get a simple balance and accounting status, for example using the "ledger balance" command. But I will have to gather more experience before I know if the ledger way is a good fit for the organisations I am involved in.
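As a rough illustration of how mechanical that CSV conversion could be, here is a toy awk snippet turning a hypothetical date,payee,amount bank export into ledger entries (the CSV layout and the account names Expenses:Misc and Assets:Checking are made up for the example):

```shell
# Toy CSV-to-ledger conversion; the column order and account names
# are assumptions for this sketch, not a real bank format.
entry=$(awk -F, '{
    printf "%s %s\n      Expenses:Misc                  $%s\n      Assets:Checking\n", $1, $2, $3
}' <<'EOF'
2004-05-27,Book Store,20.00
EOF
)
printf '%s\n' "$entry"
```

A real script would still need to map payees to the right accounts, which is where the manual work remains.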

29 November 2012

Bernhard R. Link: Gulliver's Travels

After seeing some book descriptions recently on Planet Debian, let me add a short recommendation, too. Almost everyone has heard about Gulliver's Travels already, though usually only very cursorily. For example: did you know the book describes 4 journeys and not only the travel to Lilliput? Given how influential the book has been, that is even more surprising. Words like "endian" or "yahoo" originate from it. My favorite is the third travel, though, especially the academy of Lagado, from which I want to share two gems: " His lordship added, 'That he would not, by any further particulars, prevent the pleasure I should certainly take in viewing the grand academy, whither he was resolved I should go.' He only desired me to observe a ruined building, upon the side of a mountain about three miles distant, of which he gave me this account: 'That he had a very convenient mill within half a mile of his house, turned by a current from a large river, and sufficient for his own family, as well as a great number of his tenants; that about seven years ago, a club of those projectors came to him with proposals to destroy this mill, and build another on the side of that mountain, on the long ridge whereof a long canal must be cut, for a repository of water, to be conveyed up by pipes and engines to supply the mill, because the wind and air upon a height agitated the water, and thereby made it fitter for motion, and because the water, descending down a declivity, would turn the mill with half the current of a river whose course is more upon a level.' He said, 'that being then not very well with the court, and pressed by many of his friends, he complied with the proposal; and after employing a hundred men for two years, the work miscarried, the projectors went off, laying the blame entirely upon him, railing at him ever since, and putting others upon the same experiment, with equal assurance of success, as well as equal disappointment.' 
" "I went into another room, where the walls and ceiling were all hung round with cobwebs, except a narrow passage for the artist to go in and out. At my entrance, he called aloud to me, 'not to disturb his webs.' He lamented 'the fatal mistake the world had been so long in, of using silkworms, while we had such plenty of domestic insects who infinitely excelled the former, because they understood how to weave, as well as spin.' And he proposed further, 'that by employing spiders, the charge of dyeing silks should be wholly saved;' whereof I was fully convinced, when he showed me a vast number of flies most beautifully coloured, wherewith he fed his spiders, assuring us 'that the webs would take a tincture from them; and as he had them of all hues, he hoped to fit everybody s fancy, as soon as he could find proper food for the flies, of certain gums, oils, and other glutinous matter, to give a strength and consistence to the threads.'"

20 June 2012

Michal Čihař: Rome in May

Finally I've found time to look back at our vacation in Rome and post some pictures. It was a great time and I really enjoyed it. I would like to thank Luigi Gangitano for showing us some interesting places in the city. And now, on to the pictures: Monumento a Vittorio Emanuele, the Spanish Steps, Piramide, Via Appia Antica, Villa dei Quintili, Castel Sant'Angelo, Arco di Costantino, the Colosseum.

Filed under: English Photography Travelling

8 June 2012

Lars Wirzenius: Obnam 1.0 (backup software); a story in many words

tl;dr: Version 1.0 of Obnam, my snapshotting, de-duplicating, encrypting backup program, is released. See the end of this announcement for the details.

Where we see the hero in his formative years; parental influence

From the very beginning, my computing life has involved backups. In 1984, when I was 14, my father was an independent telecommunications consultant, which meant he needed a personal computer for writing reports. He bought a Luxor ABC-802, a Swedish computer with a Z80 microprocessor and two floppy drives. My father also taught me how to use it. When I needed to save files, he gave me not one, but two floppies, and explained that I should store my files on one, and then copy them to the other every now and then. Later on, over the years, I've made backups from a hard disk (30 megabytes!) to a stack of floppies, to a tape drive installed into a floppy interface (400 megabytes!), to a DAT drive, and various other media. It was always a bit tedious.

The start of the quest; lengthy justification for NIH

In 2004, I decided to do a full backup by burning a copy of all my files onto CD-R disks. It took me most of the day. Afterwards, I sat admiring the large stack of disks, and realized that I would not ever do that again. I'm too lazy for that. That I had done it once was an aberration in the space-time continuum. Switching to DVD-Rs instead of CD-Rs would reduce the number of disks to burn, but not enough: it would still take a stack of them. I needed something much better. I had a little experience with tape drives, and that was enough to convince me that I didn't want them. Tape drives are expensive hardware, and the tapes also cost money. If the drive goes bad, you have to get a compatible one, or all your backups are toast. The price per gigabyte was coming down fast for hard drives, and it was clear that they were about to be very competitive with tapes on price. I looked for backup programs that I could use for disk-based backups.
rsync, of course, was the obvious choice, but there were others. I ended up doing what many geeks do: I wrote my own wrapper around rsync. There are hundreds, possibly thousands, of such wrappers around the Internet. I also got the idea that doing a startup to provide online backup space would be a really cool thing. However, I didn't really do anything about that until 2007. More on that later.

The rsync wrapper script I wrote used hardlinked directory trees to provide a backup history, though not in the smart way that backuppc does it. The hardlinks were wonderful, because they were cheap and provided de-duplication. They were also quite cumbersome when I needed to move my backups to a new disk for the first time: it turned out that a lot of tools deal very badly with directory trees containing large numbers of hardlinks.

I also decided I wanted encrypted backups. This led me to duplicity, which is a nice program that does encrypted backups, but I had issues with some of its limitations. To fix those limitations, I would have had to re-design and possibly re-implement the entire program. The biggest limitation was that it treated backups as a full backup plus a sequence of incremental backups, each a delta against the previous backup. Delta-based incrementals make sense for tape drives: you run a full backup once, then an incremental delta every day. When enough time has passed since the full backup, you do a new full backup, and future incrementals are based on that. Repeat forever.

I decided that this makes no sense for disk-based backups. If I have already backed up a file, there's no point in making me back it up again, since it's already there on the same hard disk. It makes even less sense for online backups, since doing a new full backup would require me to transmit all the data all over again, even though it's already on the server.
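The typical shape of such a wrapper (a toy sketch under temporary directories, not my actual script) is a dated snapshot directory per run, with rsync's --link-dest hardlinking any file unchanged since the previous snapshot:

```shell
# Toy snapshot wrapper: every run creates a new snapshot directory, but
# files unchanged since the previous snapshot are hardlinked, not copied.
src=$(mktemp -d); backups=$(mktemp -d)
echo "unchanged data" > "$src/notes.txt"

snapshot() {
    prev=$(ls "$backups" | tail -n 1)
    new="$backups/snap-$1"
    # --link-dest is only passed once a previous snapshot exists
    rsync -a ${prev:+--link-dest="$backups/$prev"} "$src/" "$new/"
}

snapshot 001
snapshot 002
# notes.txt in both snapshots shares one inode, i.e. it is stored once:
stat -c %i "$backups"/snap-001/notes.txt "$backups"/snap-002/notes.txt
```

This is exactly the cheap de-duplication described above, and also exactly the hardlink forest that other tools later choke on.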
The first battle

I could not find a program that did what I wanted, and like every good NIHolic, I started writing my own. After various aborted attempts, I started for real in 2006. Here is the first commit message:
revno: 1
committer: Lars Wirzenius <liw@iki.fi>
branch nick: wibbr
timestamp: Wed 2006-09-06 18:35:52 +0300
message:
  Initial commit.
wibbr was the placeholder name for Obnam until we came up with something better. "We" was myself and Richard Braakman, who was going to be doing the backup startup with me. We eventually founded the company near the end of 2006, and started doing business in 2007. However, we did not do very much business, and ran out of money in September 2007. We ended the backup startup experiment. That's when I took a job with Canonical, and Obnam became a hobby project of mine: I still wanted a good backup tool.

In September 2007, Obnam was working, but it was not very good. For example, it was quite slow and wasteful of backup space. That version of Obnam used deltas, based on the rsync algorithm, to back up only changes. It did not require the user to do full and incremental backups manually, but essentially created an endless sequence of incrementals. It was possible to remove any generation, and Obnam would manage the deltas as necessary, keeping the ones needed for the remaining generations and removing the rest. Obnam made it look as if each generation was independent of the others. The wasteful part was the way metadata about files was stored: each generation stored the full list of filenames and their permissions and other inode fields. This turned out to be bigger than my daily delta.

The lost years; getting lost in the forest

For the next two years, I did a little work on Obnam, but I did not make progress very fast. I changed the way metadata was stored, for example, but I picked another bad way of doing it: the new way essentially built a tree of directory and file nodes, and any unchanged subtrees were shared between generations. This reduced the space overhead per generation, but made it quite slow to look up the metadata for any one file.

The final battle; finding cows in the forest

In 2009 I decided to leave Canonical, and after that my Obnam hobby picked up speed again.
Below is a table of the number of commits per year, from the very first commit (bzr log -n0 | awk '/timestamp:/ {print $3}' | sed 's/-.*//' | uniq -c | awk '{print $2, $1}' | tac):
2006 466
2007 353
2008 402
2009 467
2010 616
2011 790
2012 282
During most of 2010 and 2011 I was unemployed, and happily hacking Obnam, while moving to another country twice. I don't recommend that as a way to hack on hobby projects, but it worked for me.

After Canonical, I decided to tackle the way Obnam stores data from a new angle. Richard told me about the copy-on-write (or COW) B-trees that btrfs uses, originally designed by Ohad Rodeh (see his paper for details), and I started reading about them. It turned out that they're pretty much ideal for backups: each B-tree stores data about one generation. To start a new generation, you clone the previous generation's B-tree and make any modifications you need. I implemented the B-tree library myself, in Python. I wanted something that was flexible about how and where I stored data, which the btrfs implementation did not seem to give me. (Also, I worship at the altar of NIH.)

With the B-trees, doing file deltas against the previous generation no longer made any sense. I realized that it was, in any case, a better idea to store file data in chunks, and re-use chunks in different generations as needed. This makes it much easier to manage changes to files: with deltas, you need to keep a long chain of deltas and apply many of them to reconstruct a particular version. With lists of chunks, you just get the chunks you need.

The spin-off franchise; lost in a maze of dependencies, all alike

In the process of developing Obnam, I have split off a number of helper programs and libraries. I have found it convenient to keep these split off, since I've been able to use them in other projects as well. However, it turns out that those installing Obnam don't like this: it would probably make sense to have a fat release with Obnam and all its dependencies, but I haven't bothered to do that yet.
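The chunk idea can be sketched in a few lines of shell (a toy illustration of the principle, nothing like Obnam's real on-disk format): name each fixed-size chunk after its checksum, and store a chunk only if that name is not already present.

```shell
# Toy content-addressed chunk store: identical chunks across files or
# generations are stored only once, keyed by their SHA-1.
store=$(mktemp -d)

backup() {
    tmp=$(mktemp -d)
    split -b 4096 "$1" "$tmp/chunk."
    for c in "$tmp"/chunk.*; do
        sum=$(sha1sum "$c" | awk '{print $1}')
        [ -e "$store/$sum" ] || cp "$c" "$store/$sum"   # de-dup: skip known chunks
    done
    rm -r "$tmp"
}

printf 'the same data\n' > /tmp/gen1
cp /tmp/gen1 /tmp/gen2      # a second, unchanged "generation"
backup /tmp/gen1
backup /tmp/gen2
ls "$store" | wc -l         # still only one stored chunk
```

Restoring a generation is then just concatenating the listed chunks in order, with no delta chain to replay.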
The blurb; readers advised about blatant marketing

The strong points of Obnam are, I think: backups may be stored on local hard disks (e.g., USB drives), on any locally mounted network file share (NFS, SMB, almost anything with remotely POSIX-like semantics), or on any SFTP server you have access to. What's not so strong is backing up online over SFTP, particularly with long round-trip times to the server, or with many small files to back up. That performance is Obnam's weakest part. I hope to fix that in the future, but I don't want to delay 1.0 for it.

The big news; readers sighing in relief

I am now ready to release version 1.0 of Obnam. Finally. It's been a long project, much longer than I expected, and much longer than was really sensible. However, it's ready now. It's not bug-free, and it's not as fast as I would like, but it's time to declare it ready for general use. If nothing else, this will get more people to use it, and they'll find the remaining problems faster than I can on my own. I have packaged Obnam for Debian; it is in unstable, and will hopefully get into wheezy before the Debian freeze. I provide packages built for squeeze in my own repository; see the download page. The changes in the 1.0 release compared to the previous one:

The future; not including winning lottery numbers

I expect to get a flurry of bug reports in the near future as new people try Obnam. It will take a bit of effort to deal with that. Help is, of course, welcome! After that, I expect to be mainly working on Obnam performance for the foreseeable future. There may also be a FUSE filesystem interface for restoring from backups, and a continuous-backup version of Obnam. Plus other features, too. I make no promises about how fast new features and optimizations will happen: Obnam is a hobby project for me, and I work on it only in my free time.
Also, I have a bunch of things that are on hold until I get Obnam into shape, and I may decide to do one of those things before the next big Obnam push.

Where; the trail of an errant hacker

I've developed Obnam in a number of physical locations, and I thought it might be interesting to list them: Espoo, Helsinki, Vantaa, Kotka, Raahe, Oulu, Tampere, Cambridge, Boston, Plymouth, London, Los Angeles, Auckland, Wellington, Christchurch, Portland, New York, Edinburgh, Manchester, San Giorgio di Piano. I've also hacked on Obnam in trains, on planes, and once on a ship, but only for a few minutes on the ship before I got seasick.

Thank you; sincerely

27 April 2012

Pietro Abate: Learning from the Future of Component Repositories - CBSE 2012

Learning from the Future of Component Repositories (Pietro Abate, Roberto Di Cosmo, Ralf Treinen and Stefano Zacchiroli) has been accepted for presentation at CBSE 2012 (26-28 June, Bertinoro, Italy).

Abstract
  An important aspect of the quality assurance of large component repositories
  is the logical coherence of component metadata. We argue that it is possible
  to identify certain classes of such problems by checking relevant properties
  of the possible future repositories into which the current
  repository may evolve. In order to make a complete analysis of all possible
  futures effective however, one needs a way to construct a finite set of
  representatives of this infinite set of potential futures. We define a class
  of properties for which this can be done.
  We illustrate the practical usefulness of the approach with two quality
  assurance applications: (i) establishing the amount of "forced upgrades"
  induced by introducing new versions of existing components in a repository,
  and (ii) identifying outdated components that need to be upgraded in order to
  ever be installable in the future. For both applications we provide
  experience reports obtained on the Debian distribution.
The tools presented in this paper (outdated and challenges) are already in Debian as part of the 'dose-extra' package.

9 January 2012

Steve McIntyre: Armhf buildds and status in Debian

Current status

Back in September, I wrote about the machines that I set up to help bootstrap the new armhf port in Debian. Building on Konstantinos' huge efforts in bringing up the new "architecture" in debian-ports, we started importing armhf into the main Debian archive on the 24th of November. Since then, those builders have been churning away night and day to build the huge collection of software that makes up the Debian archive. The current state can be seen on the armhf buildd status page, and there's a nice graph showing how quickly we've managed to run from 0 to over 90% of the archive (visit https://buildd.debian.org/stats/ for other versions). We overtook hurd-i386 quickly and are now ahead of the kfreebsd-* architectures. We've recently brought 3 more similar build machines online (hildegard, howells and hummel), again sponsored by the nice folks at Linaro but now hosted at the York NeuroImaging Centre at the University of York. This gives us both more build horsepower to keep up with building more different bits of Debian (experimental, updates etc.) and more redundancy in case of problems. We now have the vast majority of the archive built, and a number of us are now concentrating on fixing the remaining issues: language bootstraps and bugs. Also, on the 7th of January we were added into testing, the next step on our path to inclusion as a Debian release architecture.

Setting up the machines

A lot of people have been asking me about the physical setup I showed in my last blog about these machines, so here's more detail for those who are interested. Mount the 6 boards into the mini-rack, connect up the Molex power connectors to each board, attach ethernet cables and turn it all on! Each board comes with a micro-SD card containing uboot and an Ubuntu installation.
I've configured uboot to boot off the hard drive directly, but leaving configuration available to use the Ubuntu on the micro-SD as a simple rescue system should the need arise. The Quickstart boards are not ideal physically for two reasons: the lack of SATA power, plus you need to push a power button on each board to boot it - they don't boot automatically the moment power is applied. However, they're quite inexpensive little machines and have done a great job of building the Debian archive so far! The ideal machines for us would also include more RAM at this point. CPU on these is adequate, but the larger C++ packages (yay webkit!) use a huge amount of memory at link time. Linking in swap is not the best thing, performance-wise... :-( UPDATE 2012-01-12: Ian tells me that the newer Quickstart-R boards apparently have a different power controller; these now boot up straight away without needing you to push a button. That sounds useful.

15 June 2011

Christian Perrier: So, what happened with Kikithon?

I mentioned this briefly yesterday, but now I'll try to summarize the story of a great surprise and a big moment for me. All this started when my wife Elizabeth and my son Jean-Baptiste wanted to do something special for my 50th birthday. So, it indeed all started months ago, probably early March or something (I don't yet have all the details). Jean-Baptiste described this well on the web site, so I won't go into details again, but basically, this was about getting birthday wishes from my "free software family" in, as you might guess, as many languages as possible. Elizabeth brought the original idea and JB helped her by setting up the website and collecting e-mail addresses of people I usually work with: he grabbed addresses from PO files on the Debian website, plus some from his own set of GPG signatures, and here we go. And then he started poking dozens of you folks in order to get your wishes for this birthday. Gradually, contributions accumulated on the website, with many challenges for them: be sure to get as many people as possible, poking and re-poking all those FLOSS people who keep forgetting things... It seems that poking people is something that's probably in the Perrier genes! And they were doing all this without me noticing. As usual in Debian, releasing on time is a no-no. So, it quickly turned out that having everything ready by April 2nd wouldn't be possible. So, their new goal was offering this to me on Pentecost Sunday, which was yesterday. And...here comes the gift. Aha, this looks like a photo album. Could it be a "50 years of Christian" album? But, EH, why is that pic of me, with the red Debconf5 tee-shirt (that features a world map) and a "bubulle" sign, in front of the book? But, EH EH EH, what the .... are these words by H0lger, then Fil, then Joey doing on the following pages? And only then, OMG, I discover the real gift they prepared. 106, often bilingual, wishes from 110 people (some were couples!). 18 postcards (one made of wood). 
45 languages. One postcard with wishes from representatives of nearly every distro at LinuxTag 2011. Dozens of photos from my friends all around the world. All this in a wonderful album. I can't tell what I said. Anyway, JB was shooting a video, so...we'll see. OK, I didn't cry...but it wasn't far off, and the emotion was really, really intense. Guys, ladies, gentlemen, friends....it took me a while to realize what you contributed to. It took me the entire afternoon to realize the investment put by Elizabeth and JB (with JB's sisters' support) into this. Yes, as many of you wrote, I have an awesome family and they really know how to share their love. I also have an awesome virtual family all around the world. Your words are wholeheartedly appreciated and some were indeed much much much appreciated. Of course, I'll have the book in Banja Luka so that you can see the result. I know (because JB and Elizabeth told me) that many of you were really waiting to see how it would be received (yes, that includes you, in Germany, whom I visited in early May!!!). Again, thank you so much for this incredible gift. 
Thank you Holger Levsen, Phil Hands, Joey Hess, Lior Kaplan, Martin Michlmayr, Alberto Gonzalez Iniesta, Kenshi "best friend" Muto, Praveen Arimbrathodiyil, Felipe Augusto van de Wiel, Ana Carolina Comandulli (5 postcards!), Stefano Zacchiroli (1st contribution received by JB, of course), Gunnar Wolf, Enriiiiiico Zini, Clytie Siddall, Frans Pop (by way of Clytie), Tenzin Dendup, Otavio Salvador, Neil McGovern, Konstantinos Margaritis, Luk Claes, Jonas Smedegaard, Pema Geyleg, Meike "spätzle queen" Reichle, Alexander Reichle-Schmehl, Torsten Werner, "nette BSD" folks, CentOS Ralph and Brian, Fedora people, SUSE's Jan, Ubuntu's Lucia Tamara, Skolelinux' Paul, Raphaël Hertzog, Lars Wirzenius, Andrew McMillan (revenge in September!), Yasa Giridhar Appaji Nag (now I know my name in Telugu), Amaya Rodrigo, Stéphane Glondu, Martin Krafft, Jon "maddog" Hall (and God save the queen), Eddy Petrișor, Daniel Nylander, Aiet Kolkhi, Andreas "die Katze geht in die Küche, wunderbar" Tille, Paul "lets bend the elbow" Wise, Jordi "half-marathon in Banja Luka" Mallach, Steve "as ever-young as I am" Langasek, Obey Arthur Liu, YAMANE Hideki, Jaldhar H. Vyas, Vikram Vincent, Margarita "Bronx cross-country queen" Manterola, Patty Langasek, Aigars Mahinovs (finding a pic *with* you on it is tricky!), Thepittak Karoonboonyanan, Javier "nobody expects the Spanish inquisition" Fernández-Sanguino, Varun Hiremath, Moray Allan, David Moreno Garza, Ralf "marathon-man" Treinen, Arief S Fitrianto, Penny Leach, Adam D. 
Barratt, Wolfgang Martin Borgert, Christine "the mentee overtakes the mentor" Spang, Arjuna Rao Chevala, Gerfried "my best contradictor" Fuchs, Stefano Canepa, Samuel Thibault, Eloy "first samba maintainer" Parés, Josip Rodin, Daniel Kahn Gillmor, Steve McIntyre, Guntupalli Karunakar, Jano Guljaš, Karolina Kalić, Ben Hutchings, Matej Kovačič, Khoem Sokhem, Lisandro "I have the longest name in this list" Damián Nicanor Pérez-Meyer, Amanpreet Singh Alam, Héctor Orón, Hans Nordhaugn, Ivan Masár, Dr. Tirumurti Vasudevan, John "yes, Kansas is as flat as you can imagine" Goerzen, Jean-Baptiste "Piwet" Perrier, Elizabeth "I love you" Perrier, Peter Eisentraut, Jesus "enemy by nature" Climent, Peter Palfrader, Vasudev Kamath, Miroslav "Chicky" Kuře, Martín Ferrari, Ollivier Robert, Jure Čuhalev, Yunqiang Su, Jonathan McDowell, Sampada Nakhare, Nayan Nakhare, Dirk "rendez-vous for Chicago marathon" Eddelbuettel, Elian Myftiu, Tim Retout, Giuseppe Sacco, Changwoo Ryu, Pedro Ribeiro, Miguel "oh no, not him again" Figueiredo, Ana Guerrero, Aurélien Jarno, Kumar Appaiah, Arangel Angov, Faidon Liambotis, Mehdi Dogguy, Andrew Lee, Russ Allbery, Bjørn Steensrud, Mathieu Parent, Davide Viti, Steinar H. Gunderson, Kurt Gramlich, Vanja Cvelbar, Adam Conrad, Armin Beširović, Nattie Mayer-Hutchings, Joerg "dis shuld be REJECTed" Jaspert and Luca Capello. Let's say it again:

13 June 2011

Christian Perrier: So, what happened with Kikithon?

I mentioned this briefly yesterday, but now I'll try to summarize the story of a great surprise and a big moment for me. All this started when my wife Elizabeth and my son Jean-Baptiste wanted to do something special for my 50th birthday. So, it indeed all started months ago, probably early March or something (I don't yet have all the details). Jean-Baptiste described this well on the web site, so I won't go into details again, but basically, this was about getting birthday wishes from my "free software family" in, as you might guess, as many languages as possible. Elizabeth brought the original idea and JB helped her by setting up the website and collecting e-mail addresses of people I usually work with: he grabbed addresses from PO files on the Debian website, plus some from his own set of GPG signatures, and here we go. And then he started poking dozens of you folks in order to get your wishes for this birthday. Gradually, contributions accumulated on the website, with many challenges for them: be sure to get as many people as possible, poking and re-poking all those FLOSS people who keep forgetting things... It seems that poking people is something that's probably in the Perrier genes! And they were doing all this without me noticing. As usual in Debian, releasing on time is a no-no. So, it quickly turned out that having everything ready by April 2nd wouldn't be possible. So, their new goal was offering this to me on Pentecost Sunday, which was yesterday. And...here comes the gift. Aha, this looks like a photo album. Could it be a "50 years of Christian" album? But, EH, why is that pic of me, with the red Debconf5 tee-shirt (that features a world map) and a "bubulle" sign, in front of the book? But, EH EH EH, what the .... are these words by H0lger, then Fil, then Joey doing on the following pages? And only then, OMG, I discover the real gift they prepared. 106, often bilingual, wishes from 110 people (some were couples!). 18 postcards (one made of wood). 
45 languages. One postcard with wishes from nearly every distro's representatives at LinuxTag 2011. Dozens of photos from my friends all around the world. All this in a wonderful album. I can't tell what I said. Anyway, JB was shooting a video, so...we'll see. OK, I didn't cry...but it wasn't far off, and the emotion was really, really intense. Guys, ladies, gentlemen, friends....it took me a while to realize what you contributed to. It took me the entire afternoon to realize the investment put in by Elizabeth and JB (with JB's sisters' support). Yes, as many of you wrote, I have an awesome family and they really know how to share their love. I also have an awesome virtual family all around the world. Your words are wholeheartedly appreciated, and some were indeed much much much appreciated. Of course, I'll have the book in Banja Luka so that you can see the result. I know (because JB and Elizabeth told me) that many of you were really waiting to see how it would be received (yes, that includes you, in Germany, whom I visited in early May!!!). Again, thank you so much for this incredible gift. 
Thank you Holger Levsen, Phil Hands, Joey Hess, Lior Kaplan, Martin Michlmayr, Alberto Gonzalez Iniesta, Kenshi "best friend" Muto, Praveen Arimbrathodiyil, Felipe Augusto van de Wiel, Ana Carolina Comandulli (5 postcards!), Stefano Zacchiroli (1st contribution received by JB, of course), Gunnar Wolf, Enriiiiiico Zini, Clytie Siddall, Frans Pop (by way of Clytie), Tenzin Dendup, Otavio Salvador, Neil McGovern, Konstantinos Margaritis, Luk Claes, Jonas Smedegaard, Pema Geyleg, Meike "spätzle queen" Reichle, Alexander Reichle-Schmehl, Torsten Werner, "nette BSD" folks, CentOS Ralph and Brian, Fedora people, SUSE's Jan, Ubuntu's Lucia Tamara, Skolelinux' Paul, Raphaël Hertzog, Lars Wirzenius, Andrew McMillan (revenge in September!), Yasa Giridhar Appaji Nag (now I know my name in Telugu), Amaya Rodrigo, Stéphane Glondu, Martin Krafft, Jon "maddog" Hall (and God save the queen), Eddy Petrișor, Daniel Nylander, Aiet Kolkhi, Andreas "die Katze geht in die Küche, wunderbar" Tille, Paul "let's bend the elbow" Wise, Jordi "half-marathon in Banja Luka" Mallach, Steve "as ever-young as I am" Langasek, Obey Arthur Liu, YAMANE Hideki, Jaldhar H. Vyas, Vikram Vincent, Margarita "Bronx cross-country queen" Manterola, Patty Langasek, Aigars Mahinovs (finding a pic *with* you on it is tricky!), Thepittak Karoonboonyanan, Javier "nobody expects the Spanish inquisition" Fernández-Sanguino, Varun Hiremath, Moray Allan, David Moreno Garza, Ralf "marathon-man" Treinen, Arief S Fitrianto, Penny Leach, Adam D. 
Barrat, Wolfgang Martin Borgert, Christine "the mentee overtakes the mentor" Spang, Arjuna Rao Chevala, Gerfried "my best contradictor" Fuchs, Stefano Canepa, Samuel Thibault, Eloy "first samba maintainer" París, Josip Rodin, Daniel Kahn Gillmor, Steve McIntyre, Guntupalli Karunakar, Jano Guljaš, Karolina Kalić, Ben Hutchings, Matej Kovačič, Khoem Sokhem, Lisandro "I have the longest name in this list" Damián Nicanor Pérez-Meyer, Amanpreet Singh Alam, Héctor Orón, Hans Nordhaugn, Ivan Masár, Dr. Tirumurti Vasudevan, John "yes, Kansas is as flat as you can imagine" Goerzen, Jean-Baptiste "Piwet" Perrier, Elizabeth "I love you" Perrier, Peter Eisentraut, Jesus "enemy by nature" Climent, Peter Palfrader, Vasudev Kamath, Miroslav "Chicky" Kuře, Martín Ferrari, Ollivier Robert, Jure Čuhalev, Yunqiang Su, Jonathan McDowell, Sampada Nakhare, Nayan Nakhare, Dirk "rendez-vous for Chicago marathon" Eddelbuettel, Elian Myftiu, Tim Retout, Giuseppe Sacco, Changwoo Ryu, Pedro Ribeiro, Miguel "oh no, not him again" Figueiredo, Ana Guerrero, Aurélien Jarno, Kumar Appaiah, Arangel Angov, Faidon Liambotis, Mehdi Dogguy, Andrew Lee, Russ Allbery, Bjørn Steensrud, Mathieu Parent, Davide Viti, Steinar H. Gunderson, Kurt Gramlich, Vanja Cvelbar, Adam Conrad, Armi Beširović, Nattie Mayer-Hutchings, Joerg "dis shuld be REJECTed" Jaspert and Luca Capello. Let's say it again:

30 April 2011

Thomas Girard: ACE+TAO Debian packaging moved to git

We recently converted the Debian ACE+TAO package repository from Subversion to git. This was a long and interesting process; I learned a lot about git along the way. I had been using git for a while for other packages: BOUML, dwarves and GNU Smalltalk. But I did not really get it. A preliminary study led by Pau[1] compared three tools and showed that the last one, svn-all-fast-export[2], gave the best-looking results.
The conversion svn-all-fast-export requires local access to the repository, so the Alioth SVN repo was copied to my machine as svn-pkg-ace/ before running the tool:
svn-all-fast-export --identity-map authors.txt --rules pkg-ace.rules svn-pkg-ace
Here's the content of the pkg-ace.rules configuration file that was used:
create repository pkg-ace
end repository
match /trunk/
  repository pkg-ace
  branch master
end match
match /(branches|tags)/([^/]+)/
  repository pkg-ace
  branch \2
end match
The author mapping file authors.txt being:
markos = Konstantinos Margaritis <email-hidden>
mbrudka-guest = Marek Brudka <email-hidden>
pgquiles-guest = Pau Garcia i Quiles <email-hidden>
tgg = Thomas Girard <email-hidden>
tgg-guest = Thomas Girard <email-hidden>
The tool's sample configuration file merged-branches-tags.rules recommends post-processing tags, which are just branches in SVN. That's why the configuration file above treats tags as branches. The conversion was indeed fast: less than 1 minute.
Post-conversion observations Invoking gitk --all in the converted repo revealed several kinds of issues:
  • svn tags as branches: http://thomas.g.girard.free.fr/ACE/tags-as-branches.png Branches are marked with green rectangles, and tags with yellow arrows. What we have here (expected given our configuration of the tool) are branches (e.g. 5.4.7-5) corresponding to tags, and tags matching the SVN tagging commit (e.g. backups/5.4.7-5@224). We'll review and fix this.

  • merged code that did not appear as such: http://thomas.g.girard.free.fr/ACE/missing-merge-metadata.png Branches that were not merged using svn merge look like they were not merged at all.

  • commits with wrong author: http://thomas.g.girard.free.fr/ACE/wrong-author.png Before being in SVN, the repository was stored in CVS. When it was imported into SVN, no special attention was given to the commit author. Hence I got credited for changes I did not write.

  • obsolete branches: http://thomas.g.girard.free.fr/ACE/obsolete-branches.png The tool keeps all branches, including removed ones (with a tag at their tip), so that you can decide what to do with them.

  • missing merges: http://thomas.g.girard.free.fr/ACE/missing-merge.png The branch 5.4.7-12 was never merged into the trunk!

Learning git Based on the observations above, I realized my limited knowledge wouldn't do to complete the conversion and clean up the repository. There is a ton of git documentation out there, and you can find a lot of links on the git documentation page. Here's what I've used:
The Git Object Model It's described with pictures here. You really need to understand this if you haven't already. Once you do, you understand that git is built bottom-up: first the plumbing, then the porcelain. If you can't find the tool you need, it's easy to write it.
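As a quick illustration of that bottom-up design (a sketch, not from the original post): a blob id is nothing more than the SHA-1 of a "blob <size>\0" header followed by the content, which you can check with coreutils alone. The sample string is the one used in Scott Chacon's book:

```sh
#!/bin/sh
# A git blob id is the SHA-1 of "blob <size>\0<content>" -- no git needed.
content="what is up, doc?"
id=$(printf 'blob %s\0%s' "${#content}" "$content" | sha1sum | cut -d' ' -f1)
echo "$id"
```

Trees, commits and tags are stored the same way, which is why writing new plumbing on top of the object store is straightforward.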
git fast-import The Migrating to Git chapter explains how you can use the git fast-import tool to manually import anything into git. I've used it to create tags with dates in the past, slightly changing the Custom Importer example in the book:
#!/usr/bin/env ruby
#
# retag.rb
#
# Small script to create an annotated tag, specifying committer as well as
# date, and tag comment.
#
# Based on Scott Chacon "Custom Importer" example.
#
# Arguments:
#  $1 -- tag name
#  $2 -- sha-1 revision to tag
#  $3 -- committer in the form First Last <email>
#  $4 -- date to use in the form YYYY/MM/DD_HH:MM:SS

def help
  puts "Usage: retag <tag> <sha1sum> <committer> <date> <comment>"
  puts "Creates an annotated tag with name <tag> for commit <sha1sum>, using "
  puts "given <committer>, <date> and <comment>"
  puts "The output should be piped to git fast-import"
end
def to_date(datetime)
  (date, time) = datetime.split('_')
  (year, month, day) = date.split('/')
  (hour, minute, second) = time.split(':')
  return Time.local(year, month, day, hour, minute, second).to_i
end
def generate_tag(tag, sha1hash, committer, date, message)
  puts "tag #{tag}"
  puts "from #{sha1hash}"
  puts "tagger #{committer} #{date} +0000"
  print "data #{message.size}\n#{message}"
end
if ARGV.length != 5
  help
  exit 1
else
  (tag, sha1sum, committer, date, message) = ARGV
  generate_tag(tag, sha1sum, committer, to_date(date), message)
end
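For illustration (a throwaway-repo sketch; the tag name, tagger and date below are invented), the same stream format the script prints can be written by hand and piped into git fast-import:

```sh
#!/bin/sh
set -e
# Build a scratch repository with one commit to point the tag at.
dir=$(mktemp -d); cd "$dir"
git init -q .
git -c user.name=t -c user.email=t@example.org commit -q --allow-empty -m initial
sha=$(git rev-parse HEAD)
msg="Retroactive tag"
# The tagger field uses git's "raw" date format: <unix epoch> <tz offset>,
# exactly what retag.rb emits via Time#to_i.
printf 'tag v0.1\nfrom %s\ntagger Jane Doe <jane@example.org> 1297468800 +0000\ndata %s\n%s' \
  "$sha" "${#msg}" "$msg" | git fast-import --quiet
git tag -l
```

The data count is the byte length of the message, which is why the Ruby script prints message.size before the raw text.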
graft points (graft means greffe in French) Because of missing svn:mergeinfo, some changes appear unmerged. To fix this there are graft points: they override git's idea of a commit's parents. To create a graft point, assuming 6a6d48814d0746fa4c9f6869bd8d5c3bc3af8242 is the commit you want to change, currently with a single parent 898ad49b61d4d8d5dc4072351037e2c8ade1ab68 but containing changes from commit 11cf74d4aa996ffed7c07157fe0780ec2224c73e:
me@mymachine$ echo 6a6d48814d0746fa4c9f6869bd8d5c3bc3af8242 898ad49b61d4d8d5dc4072351037e2c8ade1ab68 11cf74d4aa996ffed7c07157fe0780ec2224c73e >> .git/info/grafts
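The effect is easy to see on a throwaway repository (all commits and names below are invented): c2's recorded parent is c1, and the graft makes git treat c3 as a second parent:

```sh
#!/bin/sh
set -e
dir=$(mktemp -d); cd "$dir"
git init -q .
g() { git -c user.name=t -c user.email=t@example.org "$@"; }
g commit -q --allow-empty -m c1; c1=$(git rev-parse HEAD)
g commit -q --allow-empty -m c2; c2=$(git rev-parse HEAD)
# An unrelated root commit, playing the role of the "merged but unrecorded" branch.
g checkout -q --orphan side
g commit -q --allow-empty -m c3; c3=$(git rev-parse HEAD)
# grafts file format: <commit> <parent>... (one commit per line)
echo "$c2 $c1 $c3" >> .git/info/grafts
git log -1 --format=%P "$c2"
```

Newer git versions warn that info/grafts is deprecated in favour of git replace, but it still works for this kind of history surgery.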
git filter-branch git filter-branch allows you to completely rewrite the history of a git branch, changing or dropping commits while traversing it. As an additional benefit, this tool uses graft points and makes them permanent. In other words: after running git filter-branch you can remove the .git/info/grafts file. I've used it to rewrite the author of a given set of commits, using a hack on top of Chris Johnsen's script:
#!/bin/sh

br="HEAD"
TARG_NAME="Raphael Bossek"
TARG_EMAIL="hidden"
export TARG_NAME TARG_EMAIL
filt='
    if test "$GIT_COMMIT" = 546db1966133737930350a098057c4d563b1acdf -o \
            "$GIT_COMMIT" = 23419dde50662852cfbd2edde9468beb29a9ddcc; then
        if test -n "$TARG_EMAIL"; then
            GIT_AUTHOR_EMAIL="$TARG_EMAIL"
            export GIT_AUTHOR_EMAIL
        else
            unset GIT_AUTHOR_EMAIL
        fi
        if test -n "$TARG_NAME"; then
            GIT_AUTHOR_NAME="$TARG_NAME"
            export GIT_AUTHOR_NAME
        else
            unset GIT_AUTHOR_NAME
        fi
    fi
'
git filter-branch $force --tag-name-filter cat --env-filter "$filt" -- $br
(Script edited here; there were many more commits written by Raphael.)
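A minimal end-to-end sketch of the same --env-filter idea on a throwaway repository (the commit, name and email here are made up):

```sh
#!/bin/sh
set -e
dir=$(mktemp -d); cd "$dir"
git init -q .
# One commit with the wrong author, to be fixed by the rewrite.
git -c user.name="Wrong Name" -c user.email=wrong@example.org \
    commit -q --allow-empty -m change
BAD_COMMIT=$(git rev-parse HEAD)
export BAD_COMMIT
# Same pattern as the script above: match the commit id, override the env.
filt='
    if test "$GIT_COMMIT" = "$BAD_COMMIT"; then
        GIT_AUTHOR_NAME="Raphael Bossek"
        export GIT_AUTHOR_NAME
    fi
'
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch --env-filter "$filt" -- HEAD >/dev/null
git log -1 --format=%an
```

The filter body is evaluated once per commit with GIT_COMMIT and the author/committer variables preset, which is why exporting a changed GIT_AUTHOR_NAME is all it takes.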

Important

It's important to realize that the whole history of the selected branch is rewritten, so all object ids will change. You should not do this if you have already published your repository.

The --tag-name-filter cat argument ensures our tags are copied during the traversal; otherwise they would be untouched, and hence not available in the new history.

Hint

Once git filter-branch completes you get the new history, as well as a new original ref to ease comparison. It is highly recommended to check the result of the rewrite before removing original. To shrink the repo afterwards, git clone the rewritten repo using the file:// syntax -- the git-filter-branch manpage says it all.
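A sketch of that shrinking step on throwaway paths: a file:// clone copies objects instead of hardlinking, and a clone only fetches branches and tags, so refs/original/* and the pre-rewrite objects are simply left behind:

```sh
#!/bin/sh
set -e
dir=$(mktemp -d); cd "$dir"
# Stand-in for the repository that was just rewritten.
git init -q rewritten
git -C rewritten -c user.name=t -c user.email=t@example.org \
    commit -q --allow-empty -m tidy
# file:// forces a real object copy (no hardlinks to the old object store).
git clone -q "file://$dir/rewritten" compact
git -C compact log -1 --format=%s
```

The compact clone holds only the rewritten history; the original repository can then be archived or deleted.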

Cleaning up the repo To recap, here's how the ACE+TAO git repo was changed after conversion:
  1. Add graft points where needed.

  2. Clean tags and branches. Using git tag -d, git branch -d and the Ruby script above, it was possible to recreate the tags. During this I was also able to add missing tags, and remove some SVN mistakes I had made -- like committing in a branch created under tags/.

  3. Remove obsolete branches.

  4. Merge missing pieces. There were just two missing debian/changelog entries. I did this before git filter-branch because I did not find a way to use the tool correctly with multiple heads.

  5. Fix commit author where needed. Using the shell script above Raphael is now correctly credited for his work.

That's it. The ACE+TAO git repository for Debian packages is alive at http://git.debian.org/?p=pkg-ace/pkg-ace.git;a=summary.
[1]http://lists.alioth.debian.org/pipermail/pkg-ace-devel/2011-March/002421.html
[2]available in Debian as svn-all-fast-export

20 August 2010

Riku Voipio: (unofficial) Bits from ARM porters

Quite a few things have happened recently in Debian/ARM land.

ARM and Canonical have generously provided us with a bunch of fast armel machines. Four of these are now buildds, bringing the total number of armel buildds to 7. In other words, armel should no longer lag behind other architectures when building unstable packages.

One of the machines, abel.debian.org, has been set up as a porter box. It is faster (around 3x) than the older porterbox (agricola). All Debian Developers have access to the porterboxes.

The new buildds have allowed us to enable more suites. Thanks to Philipp Kern's work, armel now builds experimental, lenny-volatile and lenny-backports as well as unstable/non-free. Especially if you are using stable Debian, access to backports and volatile should make life happier :)

Finally, the next big thing is the hard-float ARM port, an effort led by Konstantinos Margaritis. This doesn't mean that the armel port is going away. The majority of ARM CPUs sold are still without an FPU, so the soft-float port (armel) will still have a long life ahead. Meanwhile, the armhf port will provide a more optimal platform for people with bleeding-edge ARM cores (ARMv7 + VFP). Some people have been unhappy with the proposed new port, and various alternatives have been suggested. However, armhf is currently the only solution being actively worked on.

Update: thanks to Canonical too.

8 June 2010

Adrian von Bidder: Blue Velvet

A somewhat naïve young man, a kidnapping, a policeman's daughter, a night-club singer, said policeman's partner being involved, a love triangle, a pervert and (probably because it's Lynch) a cut-off ear. What more do you need ...? Or, in other words, I absolutely regret not having seen Blue Velvet before. In contrast, Pink Flamingos takes its fight over the title of the filthiest woman alive (which sets the whole plot in motion) a bit too literally for my taste. I like movies that leave more things up to the imagination of the spectator. Or perhaps it's just that my sense of aesthetics doesn't agree with John Waters's. (And to the person who knew nothing better to say than "You idiot" to my negative critique of Tarantino's Inglourious Basterds: I don't delete comments on my blog that don't agree with me. But I do not tolerate personal attacks. Post it to your own blog. You're as entitled to your opinion as I am.)

6 May 2010

Adrian von Bidder: Inglourious Basterds

Whew. Thank God I downloaded that one and did not spend money on it. If you're trying to poke fun (or be satirical) at the Nazis, do it like Chaplin, or do it like Monty Python. Inglourious Basterds is just not funny and hasn't got enough substance to be satire. It's just fooling around by people with too big an ego and too much money. I'm extremely disappointed with Tarantino here; I liked his other movies and had expected quite a bit more. I'm not surprised to see Brad Pitt in such a movie (he's not a bad actor, but I've always had the impression that he's not picky about the roles he plays), and Uma Thurman gets bonus points for not appearing :-) It just occurs to me that as well as completely missing the mark with the plot here, Tarantino also utterly fails at assembling a gripping soundtrack. Coincidence? If we ignore all this, the film is at least well made, and most of the actors give a good performance (I particularly like Christoph Waltz as the complete asshole). (On second thought: no, he didn't miss the mark with the plot, but with the way he filmed it. Use the same plot but do it as a slapstick comedy, and I think it could have been really funny. Use the same basic plot but take it seriously, and it might have been a really thrilling war movie. Etc. So the plot is not to blame.)

11 February 2010

Joachim Breitner: Diploma Thesis Finished

Earlier today, I went to a local copy shop and had my diploma thesis printed. This afternoon, I will hand it in. The title is "Loop subgroups of F_r and the images of their stabilizer subgroups in GL_r(ℤ)" and it discusses a group-theoretical result. I assume that very few readers care about the content of the thesis, but maybe some are interested in a few assorted LaTeX hints. I'm also publishing the full TeX source code; maybe someone can make use of it. Less chatty varioref I'm using the varioref package, in conjunction with the cleveref package. This provides a command \vref{fig:S3S4l} which will expand, for example, to "Figure 1 on page 11". But if the referenced figure is actually on the current page, the next page, the previous page or the facing page (in two-side layouts), it will say so: "Figure 1 on this page". This is very nice, but I assume that the reader of my thesis is able to find Figure 1 when it is visible, i.e. on the current or facing page. One can remove the referencing texts with the commands \def\reftextfaceafter, \def\reftextfacebefore and \def\reftextcurrent. But because varioref puts a space between "Figure 1" and this text, we would get a superfluous space even before punctuation. The remedy is the command \unskip, which removes this space again. So I use in my preamble:
\def\reftextfaceafter{\unskip}%
\def\reftextfacebefore{\unskip}%
\def\reftextcurrent{\unskip}%
Palatino and extra leading I chose the Palatino font for my thesis, using the mathpazo package. Various sources (such as the KOMA-Script manual) suggest using 5% extra leading:
\linespread{1.05}
Counting figures independently from chapters I don't have too many figures and tables in my thesis, and I want them to be numbered simply 1, 2, ... By default, LaTeX would say 1.1, 1.2, 2.1, ... This can be fixed using the remreset package and these commands:
\makeatletter
\@removefromreset{figure}{chapter}
\renewcommand{\thefigure}{\arabic{figure}}
\@removefromreset{table}{chapter}
\renewcommand{\thetable}{\arabic{table}}
\makeatother
No widows and club lines LaTeX already avoids these, but I wanted to get rid of them completely. This can be done with:
% Disable single lines at the start of a paragraph (Schusterjungen)
\clubpenalty = 10000
% Disable single lines at the end of a paragraph (Hurenkinder)
\widowpenalty = 10000 \displaywidowpenalty = 10000
Struck table lines I had to typeset tables with some lines struck, and I could not find a ready command for that. I used the following definition, based on the code for \hline. Note that it probably does not adjust well to other font sizes and needs to be adjusted manually:
\makeatletter
\def\stline{%
  \noalign{\vskip-.7em\vskip-\arrayrulewidth\hrule \@height \arrayrulewidth\vskip.7em}}
\makeatother
Title page in one-sided layout According to the KOMA manual, the title page as set by LaTeX is not meant to be the cover of a publication, and therefore has to be set with the margins of a right page, i.e. a larger right margin and a smaller left margin. But when printing cheaply, one often just puts a transparent sheet on top of the print, so the title page is the cover. You can convince KOMA that you are right by using
\KOMAoptions{twoside=false}
\begin{titlepage}
...
\end{titlepage}
\KOMAoptions{twoside=true}
Not flushing the page for chapter heads LaTeX would put the list of algorithms on a new right page. I found this a waste of paper for my few algorithms, and preferred to put the list right after the table of contents. You can override the LaTeX behavior using:
\tableofcontents
{
\let\cleardoublepage\relax  % book
\let\clearpage\relax  % report
\let\chapter\section
\listofalgorithms
}
This code also reduces the size of the heading to that of a section. The same trick also works with \chapter. Math in headings vs. PDF bookmarks LaTeX with the hyperref package creates nice PDF bookmarks from your chapter and section titles. Unfortunately, PDF bookmark names can only be plain strings, while the titles in the document might contain some math symbols. You can make both happy with \texorpdfstring:
\section{Stabilizer subgroups in \texorpdfstring{$\GL_r(\Z/2\Z)$}{GL\_r(Z/2Z)}}
Setting lines for the signature The diploma thesis contains a small note which I have to sign, saying that I created it on my own etc. Below that, I put two labeled lines for date and signature, using the tabbing environment:
\begin{tabbing}
\rule{4cm}{.4pt}\hspace{1cm} \= \rule{7cm}{.4pt} \\
Ort, Datum \> Unterschrift
\end{tabbing}

24 January 2010

Russell Coker: Preventing Children from Accessing Porn

The following was written by Stefano Cosentino in regard to the ongoing efforts of the Australian government to censor the Internet with protecting the children as an excuse. All these Internet filtering ideas that have been in the news lately have made me voice my own opinion on the matter as a non-expert. I'm an IT advisor. I take someone's problem and help them fix it. I have a few clients who provide laptops to their students; everything is done with these laptops. The students have no books. The school provides laptops to their primary school students as well as their high school students. They have done this since long before the public system started to hand out laptops to a select number of high school students. When you provide a child with anything, there are always areas that a child will find that you may have overlooked. In fact, a young kid will probably find a host of things that you might have totally missed or didn't ever know about. One of these things is the inappropriate nature of some of the information associated with computers. This can be anything. But specifically, the filtering argument has been leaning towards Internet pornography and, I would imagine, more specifically content of a pedophiliac nature. I'm not against child pornography being banned or filtered. I personally think this is one of the most cruel, inconsiderate, disrespectful and self-centered behaviors a person could display. Their psychological makeup isn't the scope of this article. However these ideas must be conveyed when discussing the Internet as a modern technological device that can be used for both good and bad. The primary school students that I attend to aren't very interested in this stuff and, as has been mentioned long ago by others who have joined this argument, are more interested in online flash games that include characters such as Ben 10, Pokemon and Yu-Gi-Oh. 
When I'm called in to scan these computers during the school holidays and hand the laptops back to the kids, or when they get handed back after 3 years for new ones, I find that the older primary school kids' computers are usually more prone to adult-oriented content. To me this is the first sign of a filter's failure. This customer of mine spends more money on the filtering devices they use on campus than I charge them for a few days of work. Does it work? Sadly, I hate to report that no, it does not. It's completely useless. The kids still have files, and traces of files, on their computers that aren't suitable for young children. Not only pornography but also content of a violent or morbid nature. As young and impressionable people, everything they read, see and hear is absorbed. This shapes the way society will become in years to come. Since no one can realistically tell the future, wouldn't it be nice if we could make sure that the people looking after us and our lives forty or fifty years from now have a sane mind free of blemish? It's a different story for the high school children. While scanning their laptops I have to contact the police on a daily basis because of the nature of the content I find during my day. Some things you can let slip: today's version of a pinup girl, or a provocative pic of whatever skimpily clad girl the record companies are flogging off these days as a musician. But sadly that's rare. I won't go into detail about what I find on these laptops of 14-year-old boys, but it ranges from some innocent growing-up curiosity right through to perverted, sick and, most if not all of the time, illegal content. The filters fail again. The kids find a way around them. And they find it easily. I've seen filters work and not work at all with young kids right through to young adults. 
All the filters do is either hinder the poor kids actually trying to do research of a scholastic nature, or prolong the inevitable and temporarily block a determined child's interest in the search for some adult-related material. The filter might prevent accidental viewing, but it doesn't stop the deliberate finding of pornography and other illegal content. How does a filter stop this from happening? How does a filter stop a child taking their parents' adult videos and copying them to their laptop, or finding dad's stash of Penthouse? How does it stop a school mate bringing this stuff to school to show everyone at lunch time, or to trade for other content they found by other means? Remember back to when you were their age and caught a glimpse of your big brother's room wall. How many times did you try and catch a peek at that Samantha Fox poster hanging off the wall? Where's the filter now? Here's something to think about. The filter shouldn't be a thing, it should be a person. They're called "responsible". They're called parents. What priorities do parents have if their child feels that what they look at online that is of an adult nature is acceptable? Or maybe the kid knows better, knows it isn't acceptable, but still goes out of their way to get the stuff on their computer? Sneaker net still exists; USB memory sticks are cheap and can now hold two or three straight DVD rips, or perhaps five or six encoded films. Hundreds and thousands of images and so on. Filter failure again. When the kids go online, they know of the technology used to block them from gaining access to what they want to see. Chances are, they'll know what a proxy server is and does. Then they'll figure out what they need to do to get around the filter. I myself did this back in high school and TAFE, when I couldn't find photographs of a particular device I was researching. Turns out the name is also a form of sexual activity, in another language, but still. 
The filter stopped me not only from looking at the offending content but also from looking at the legitimate data that I needed to complete an assignment. I got around the problem by researching some more information, and the following day I was breaking through firewalls and proxy servers with ease. Filter failure. How do I get around this issue when speaking to younger kids who need guidance and knowledge on how to deal with this situation? I hold talks at the schools I provide my services to. I talk to the parents, not the kids. The talk costs less than a broken filter they keep throwing money at to keep up-to-date. The school puts these filters in place to appear responsible, because while the kids are attending their school, the school is in fact responsible. In fact, there is nothing more the school can do. They could educate the children, but you can tell someone what to do, and the chances of them doing it are pretty dismal. Music is not allowed on their computers either. Yet we constantly find iTunes on there, and a host of music that traces back to certain peer-to-peer applications where they acquired the stolen music. If a kid can learn how to do that, imagine what sort of influence can be placed on them from a more positive angle. Like maybe parents providing an explanation, for starters, of what it is they're looking at. What it is they'll find online. What material is inappropriate. What material you should tell an adult about. Why do I get stupid emails with Russian girls wanting to marry me? Kids absorb everything. Parents have relegated responsibility but not delegated it. This filter idea might help slow down a child's enthusiasm to learn about everything, both good and bad. But educating the kids from an early stage in life about morals and the modern world where, let's think about it, we have absolutely everything we need and want at our fingertips will be more valuable than any filter. 
But the fact that we have so much available makes it difficult to say what is and isn't appropriate for a child to see. It is up to us to inform the children of what's out there in the world. It may or may not stop them from seeing the adult-related content, but it will help them respond to it in a mature and adult manner. We all know kids aren't stupid. So, if the filters worked, why am I called in once a year, every year, to give my talk to parents?

23 November 2009

Martín Ferrari: Movies

Just wanted to share comments on some movies I've watched recently. Tags: Planet Lugfi, Planet Debian

20 November 2009

Gunnar Wolf: EDUSOL almost over - Some highlights

Whew! Is it karma or what? What makes me get involved in two horribly complex, two-week-long conferences, year after year? Of course, both (DebConf and EDUSOL) are great fun to be part of, and both have greatly influenced my skills and my interests. Anyway, this is the fifth year we have held EDUSOL. Tomorrow we will bring the two weeks of activities to an end, hold the last two videoconferences, and finally declare it a done deal. I must anticipate the facts and call it a success, as it clearly will be recognized as such. One of the most visible (although, we insist, not the core) activities of the Encounter are the videoconferences. They are certainly among the most complex. And the videoconferences' value is greatly enhanced because, even though they are naturally a synchronous activity (each takes place at a given point in time), they live on after they are held: I do my best to publish them as soon as possible (less than a day later), and they are posted to their node, from where comments can continue. This is also why we could move tomorrow's conference at the last minute: due to a misunderstanding, Beatriz Busaniche (a good friend of ours and a very renowned Argentinian Free Software promoter, from Vía Libre) thought her talk would be held today, while we had scheduled her for tomorrow. No worries - we held it today, and it is already online for whoever wants to take part :-) So, I don't want to hold this back any longer (I will link to the two conferences that I'm still missing from this same entry). Here is the list of (and links to) the videoconferences we have held.
Tuesday 2009-11-17
Wednesday 2009-11-18
Thursday 2009-11-19
Friday 2009-11-20
As two last notes: Regarding the IRC interaction photos I recently talked about, we did a very kewl thing: Take over 2000 consecutive photos and put them together in a stack. Flip them one at a time. What do you get? But of course: A very fun and interesting interaction video! We have to hand-update it, and it is a bit old right now, but it is nevertheless very interesting as it is. Finally... I must publicly say I can be quite an asshole. And yes, I know I talked this over privately with the affected people and they hold no grudge against me... But still - yesterday we had an IRC talk about NING Latin American Moodlers, by Lucía Osuna (Venezuela) and Maryel Mendiola (Mexico). One of the points they raised was that they were working towards (and promoting) a Moodle certification. And... Yes, I recognize I cannot hear the word "certification" mentioned without jumping up and saying certifications are overrated. Well, being tired, and not being really thoughtful... I should have known where to stop, where the point had been made well enough. I ended up making Maryel and Lucía feel attacked during their own presentation, and that should never have happened. A public and heartfelt apology to them :-(

28 October 2009

Eduardo Marcel Macan: Latinoware: A minha menina (sources)

The talk on music production with free software at Latinoware 2009 was a success. A lot of people asked for the little song we produced to illustrate the talk, which Cesar Brod sang with such flair :) So here is the project file for the song "A minha menina", which was chosen as a tribute to my girlfriend and because Brod wanted something MPB. I needed something that could vaguely be turned into technopop/electro. The song was edited with LMMS (Linux MultiMedia Studio) 0.3.2 and can be loaded in any equal or later version, since it was made exclusively with the plugins and samples that ship with the standard LMMS distribution. Despite the name, LMMS is also available for Windows. What are you waiting for? Download LMMS from here, and the song's original project file from here. P.S.: By the way, the song was not mastered, and the volumes are adjusted to the parameters of my laptop, my headphones, and the room at Latinoware. Play with the volume of each track to adjust it to your equipment.

1 October 2009

Ingo Juergensmann: Debian Unstable - nomen est omen

Oh well... a few days ago my machine crashed during the nightly backup. When I restarted it the next morning, I noticed two problems: the LVM was not recognized anymore, and X crashed on startup.
With the help of Tino Keitel (scorpi) I fixed the first issue by installing an older version of the lvm2 package (namely: lvm2_2.02.39-7_i386.deb) and libdevmapper1.02.1 (namely: libdevmapper1.02.1_1.02.37-1_i386.deb). Tino reported this in Bug #546817.
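For reference, the downgrade described above boils down to installing the older .deb files directly with dpkg. A rough sketch, using the exact package filenames mentioned in the post; the hold step is an extra precaution I am assuming one would want, so that the next dist-upgrade does not pull the broken versions back in:

```shell
# Install the older, known-good packages (filenames as given in the post);
# libdevmapper goes first since lvm2 depends on it
dpkg -i libdevmapper1.02.1_1.02.37-1_i386.deb lvm2_2.02.39-7_i386.deb

# Put both packages on hold so apt does not immediately upgrade them again
echo "libdevmapper1.02.1 hold" | dpkg --set-selections
echo "lvm2 hold" | dpkg --set-selections
```

Once the bug is fixed in unstable, the hold can be lifted the same way with "install" instead of "hold".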

But there's still the Xorg problem left. Xorg.0.log reports:

(II) RADEON(0): RADEONRestoreMemMapRegisters() :
(II) RADEON(0): MC_FB_LOCATION : 0xefffe800 0xefffe800
(II) RADEON(0): MC_AGP_LOCATION : 0xf87ff800
restore common
restore crtc1
restore pll1
finished PLL1
set RMX
set FP1
enable FP1
disable primary dac
disable TV
(II) RADEON(0): RandR 1.2 enabled, ignore the following RandR disabled message.

Backtrace:
0: X(xorg_backtrace+0x3b) [0x81314bb]
1: X(xf86SigHandler+0x51) [0x80c57d1]
2: [0xb7f32400]
3: X(xf86DiDGAInit+0x2f) [0x80f08cf]
4: X(xf86CrtcScreenInit+0x110) [0x80ed9e0]
5: /usr/lib/xorg/modules/drivers//radeon_drv.so [0xb7a2a114]
6: X(AddScreen+0x19d) [0x80712cd]
7: X(InitOutput+0x206) [0x80adfa6]
8: X(main+0x1db) [0x80719bb]
9: /lib/i686/cmov/libc.so.6(__libc_start_main+0xe5) [0xb7bb37a5]
10: X [0x8071051]

Fatal server error:
Caught signal 11. Server aborting


Maybe this is Bug #546586? I've already detached the second display, but it still crashes. Pointers and solutions appreciated!

UPDATE: thanks for the pointers! X is now working again!
